EACOFT: An energy-aware correlation filter for visual tracking
Authors
Abstract
• An energy-aware correlation filter tracker that adaptively adjusts the target's energy for tracking.
• A new strategy rejects low-quality samples and ensures the model's discriminative ability.
• Combining bottom-up and top-down strategies yields optimal, robust training.
• Outperforms many state-of-the-art trackers on several challenging datasets.

Correlation filter based trackers, which owe their efficiency to calculation in the frequency domain, can locate targets at a relatively fast speed. This characteristic, however, also limits their generalization to some specific scenarios. The reasons that they still fail to achieve state-of-the-art (SOTA) performance are possibly due to two main aspects. The first is that, while tracking objects whose energy is lower than that of the background, the tracker may drift or even lose the target. The second is that biased samples may inevitably be selected for training, which can easily lead to inaccurate tracking. To tackle these shortcomings, a novel energy-aware correlation filter based tracking (EACOFT) method is proposed. In our approach the energy between foreground and background is adaptively balanced, which enables the target of interest to always have higher energy than the background. The samples' qualities are also evaluated in real time, which ensures that the templates used for updating are always helpful for tracking. In addition, we propose an optimal training strategy that combines bottom-up and top-down processes, which plays an important role in improving both the effectiveness and robustness of tracking. As a result, our tracker achieves a great improvement over the baseline tracker, especially under background clutter and fast motion challenges. Extensive experiments over multiple benchmarks demonstrate the superiority of the proposed methodology in comparison with a number of SOTA trackers.
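To make the two ideas in the abstract concrete, the sketch below shows them in the simplest possible setting: a MOSSE-style single-channel correlation filter with (i) a foreground/background energy-balancing step and (ii) a peak-to-sidelobe-ratio (PSR) gate that skips low-quality template updates. The class and function names, the balancing rule, the PSR normalisation and the threshold value are illustrative assumptions, not the published EACOFT formulation.

```python
import numpy as np

def gaussian_response(shape, sigma=2.0):
    """Desired correlation output: a Gaussian peak centred on the patch."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2.0 * sigma ** 2))

class EnergyAwareCFSketch:
    """Minimal MOSSE-style filter with two hypothetical hooks inspired by the
    abstract: energy balancing and sample-quality gating (illustrative only)."""

    def __init__(self, lam=1e-2, lr=0.125, quality_thresh=0.3):
        self.lam = lam                        # regularisation term
        self.lr = lr                          # template learning rate
        self.quality_thresh = quality_thresh  # assumed quality gate value
        self.A = self.B = self.G = None       # filter numerator/denominator, target response

    def _balance_energy(self, patch, fg_mask):
        """If foreground energy is lower than background energy, rescale the
        foreground so the target keeps the higher energy."""
        fg_energy = np.sum(patch[fg_mask] ** 2)
        bg_energy = np.sum(patch[~fg_mask] ** 2) + 1e-12
        if fg_energy < bg_energy:
            patch = patch.copy()
            patch[fg_mask] *= np.sqrt(bg_energy / (fg_energy + 1e-12))
        return patch

    def init(self, patch, fg_mask):
        patch = self._balance_energy(patch, fg_mask)
        F = np.fft.fft2(patch)
        self.G = np.fft.fft2(gaussian_response(patch.shape))
        self.A = self.G * np.conj(F)
        self.B = F * np.conj(F) + self.lam

    def track_and_update(self, patch, fg_mask):
        patch = self._balance_energy(patch, fg_mask)
        F = np.fft.fft2(patch)
        response = np.real(np.fft.ifft2((self.A / self.B) * F))
        # PSR as a crude sample-quality score; update only on confident samples.
        psr = (response.max() - response.mean()) / (response.std() + 1e-12)
        peak = np.unravel_index(response.argmax(), response.shape)
        if psr / 20.0 > self.quality_thresh:  # normalisation by 20 is an assumption
            self.A = (1 - self.lr) * self.A + self.lr * self.G * np.conj(F)
            self.B = (1 - self.lr) * self.B + self.lr * (F * np.conj(F) + self.lam)
        return peak, psr
```

A caller would initialise the sketch with a grayscale patch and a boolean foreground mask from the first frame, then call track_and_update on patches cropped around the previous location; the returned peak index gives the new position estimate, and rejected (low-PSR) frames simply leave the template unchanged, which is the spirit of the sample-quality strategy described above.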
Similar resources
Scene-Aware Adaptive Updating for Visual Tracking via Correlation Filters
In recent years, visual object tracking has been widely used in military guidance, human-computer interaction, road traffic, scene monitoring and many other fields. The tracking algorithms based on correlation filters have shown good performance in terms of accuracy and tracking speed. However, their performance is not satisfactory in scenes with scale variation, deformation, and occlusion. In ...
Attentional Correlation Filter Network for Adaptive Visual Tracking <Supplementary Material>
To show the effect of the parameters used in the Attentional Correlation Filter Network (ACFN), two additional experiments were conducted. In the first experiment, we varied the number of selected tracking modules (Na) in order to validate the robustness of the attentional mechanism, as shown in Fig. 2 (a). For this experiment, the number of tracking modules with high predicted validation score...
Learning Background-Aware Correlation Filters for Visual Tracking - Supplementary Material
Spatial size of training samples: We evaluated the performance of our tracker over a range of different spatial support sizes on the OTB50 dataset, as shown in Table 1. We set the spatial size of training samples to be N² times bigger than the target, where N ∈ [2, ..., 5]. This experiment shows that increasing the support size improves the overlap precision, since more background patches are u...
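As a rough illustration of what the support-size setting above means in practice, the snippet below crops a square-scaled training region whose sides are N times the target's sides (so its area is N² times the target area). The helper name, the grayscale-frame assumption and the edge padding are choices made here for illustration, not code from the cited supplementary material.

```python
import numpy as np

def crop_training_region(frame, cx, cy, target_w, target_h, N=4):
    """Crop a training patch N times larger than the target in each dimension
    (N^2 times larger in area), padding with edge values near the border.
    `frame` is assumed to be a 2-D grayscale array; illustrative helper only."""
    w, h = int(round(N * target_w)), int(round(N * target_h))
    x0, y0 = int(round(cx - w / 2)), int(round(cy - h / 2))
    padded = np.pad(frame, ((h, h), (w, w)), mode="edge")
    return padded[y0 + h:y0 + 2 * h, x0 + w:x0 + 2 * w]
```

With N = 2 the patch contains mostly target pixels, while N = 5 brings in far more surrounding background, which is why larger support sizes supply more negative (background) patches for training.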
CFNN: Correlation Filter Neural Network for Visual Object Tracking
Albeit convolutional neural network (CNN) has shown promising capacity in many computer vision tasks, applying it to visual tracking is yet far from solved. Existing methods either employ a large external dataset to undertake exhaustive pre-training or suffer from less satisfactory results in terms of accuracy and robustness. To track a single target in a wide range of videos, we present a novel ...
Particle Filter Re-detection for Visual Tracking via Correlation Filters
Most of the correlation filter based tracking algorithms can achieve good performance and maintain fast computational speed. However, in some complicated tracking scenes, there is a fatal defect that causes the object to be located inaccurately. In order to address this problem, we propose a particle filter redetection based tracking approach for accurate object localization. During the trackin...
Journal
Journal title: Pattern Recognition
Year: 2021
ISSN: 1873-5142, 0031-3203
DOI: https://doi.org/10.1016/j.patcog.2020.107766